The Entropy Rate of the Hidden Markov Process
Abstract
A stochastic process obtained as a noisy observation of a Markov process through a memoryless channel is called a hidden Markov process (HMP). Finding the entropy rate of an HMP is motivated by applications in both stochastic signal processing and information theory. The first expression for the entropy rate of an HMP was found in 1957 [1]. That expression is defined through a measure characterized by an integral equation, and the measure is hard to extract from the equation in any explicit way. Besides a known bound on the entropy rate for the general case, simple expressions for the entropy rate have recently been obtained for special cases in which the parameters of the hidden Markov source approach zero [2], [3]. In this paper the entropy rate is defined by a measure given as the stationary distribution of a Markov process with a specific initial distribution and specific conditional probabilities. This allows exact computation of the state distribution at any time index, and eventually of the measure. Although the state space of the associated Markov process is a continuous set, we show that its distribution is a probability mass function, which leads to a simple algorithm generating a sequence of numbers that converges monotonically from above to the entropy rate.

A hidden Markov process {Z_n}_{n=0}^∞ is defined by a quadruple [S, P, Z, T], where S and P are the state set and transition probability matrix of a Markov process, and Z and T are the observation set and the |S| × |Z| observation probability matrix, respectively. Defining the random variable q_n(Z^{n-1}) over the |Z|-dimensional probability simplex (denoted ∇_Z) with components q_n[k] = Pr(Z_n = k | Z^{n-1}), we show that the entropy rate is Ĥ(Z) = lim_{n→∞} E[h(q_n)], where h(·) is the entropy function over ∇_Z. Since {q_n}_{n=0}^∞ is not a Markov process, the measure needed to evaluate this expectation is not computable.
However, if we consider the information-state process {π_n}_{n=0}^∞, which is related to {q_n}_{n=0}^∞ by the transformation q_n = q(π_n) = π_n × T, the entropy rate can be formulated as Ĥ(Z) = lim_{n→∞} E[h(q(π_n))].
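The monotone-from-above convergence mentioned in the abstract corresponds to the standard fact that the conditional entropies H(Z_n | Z^{n-1}) decrease to the entropy rate. As an illustration only (the matrices P and T below are hypothetical examples, not taken from the paper), the following sketch computes the first few conditional entropies of a small HMP by exhaustive forward filtering over all observation prefixes:

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def upper_bounds(P, T, n_max):
    """Return [H(Z_0), H(Z_1|Z_0), ..., H(Z_{n_max-1}|Z^{n_max-2})],
    a sequence that decreases monotonically to the entropy rate.
    P: |S|x|S| transition matrix; T: |S|x|Z| observation matrix."""
    # Stationary distribution of P (eigenvector of P^T for eigenvalue 1).
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmax(np.real(evals))])
    mu /= mu.sum()
    nz = T.shape[1]
    bounds = []
    for n in range(n_max):
        total = 0.0
        for prefix in itertools.product(range(nz), repeat=n):
            # Forward filter: alpha = Pr(S_t | z^{t-1}), prob = Pr(z^{t-1}).
            alpha, prob = mu.copy(), 1.0
            for z in prefix:
                v = alpha * T[:, z]       # joint weight of state and observation
                pz = v.sum()              # Pr(Z_t = z | past observations)
                if pz == 0:
                    prob = 0.0
                    break
                alpha = (v / pz) @ P      # condition on z, then propagate one step
                prob *= pz
            if prob == 0:
                continue
            q = alpha @ T                 # predictive law of the next observation
            total += prob * entropy(q)
        bounds.append(total)
    return bounds
```

The cost is exponential in n (all |Z|^n prefixes are enumerated), so this brute-force version is only meant to make the definition concrete for small examples; the paper's contribution is precisely a cheaper way to reach the limit.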
Similar articles
Taylor Expansion for the Entropy Rate of Hidden Markov Chains
We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate is an open problem. We introduce some probability matrices based on the parameters of the Markov chain and the channel. Then, we try to obtain an estimate ...
Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is a finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of ...
The Rate of Rényi Entropy for Irreducible Markov Chains
In this paper, we obtain the Rényi entropy rate for irreducible aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that the limit of this bound is the Shannon entropy rate.
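For finite-state irreducible chains there is a classical spectral formula for the Rényi entropy rate: for order α ≠ 1 it equals (1/(1−α)) log λ(α), where λ(α) is the Perron root of the matrix with entries p_ij^α; the cited paper's countable-state treatment generalizes results of this kind. A finite-state sketch (the example matrices are illustrative, not from the paper), including the α → 1 recovery of the Shannon rate:

```python
import numpy as np

def renyi_rate(P, alpha):
    """Rényi entropy rate (bits/symbol) of an irreducible finite-state
    Markov chain, order alpha != 1, via the spectral-radius formula:
    (1/(1-alpha)) * log2 of the Perron root of the elementwise power P^alpha."""
    R = P ** alpha                            # elementwise power, not matrix power
    lam = max(abs(np.linalg.eigvals(R)))      # Perron root (spectral radius)
    return np.log2(lam) / (1.0 - alpha)

def shannon_rate(P):
    """Shannon entropy rate: sum_i mu_i * H(P[i, :]) with mu stationary."""
    evals, evecs = np.linalg.eig(P.T)
    mu = np.real(evecs[:, np.argmax(np.real(evals))])
    mu /= mu.sum()
    terms = np.where(P > 0, P * np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return -np.sum(mu * terms.sum(axis=1))
```

As α → 1 the Rényi rate converges to the Shannon rate, which is the flavor of limiting statement the abstract alludes to.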
ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chains and Gaussian Processes
In this paper, the two-parameter ADK entropy, as a generalization of Rényi entropy, is considered and some of its properties are investigated. We will see that the ADK entropy for continuous random variables is invariant under a location transformation but is not invariant under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and the chain rule of this ent...
An efficient algorithm for the entropy rate of a hidden Markov model with unambiguous symbols
We demonstrate an efficient formula to compute the entropy rate H(μ) of a hidden Markov process with q output symbols where at least one symbol is unambiguously received. Using an approximation of H(μ) by its first N terms, we give an O(Nq) algorithm to compute the entropy rate of the hidden Markov model. We use the algorithm to estimate the entropy rate when the parameters of the hidden Markov m...
Estimation of the Entropy Rate of Ergodic Markov Chains
In this paper an approximation of the entropy rate of an ergodic Markov chain via sample-path simulation is calculated. Although an explicit form of the entropy rate exists here, the exact computational method is laborious to apply. It is demonstrated that the entropy rate of the Markov chain estimated from a sample path not only converges to the correct entropy rate but also does so exponential...
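A minimal version of the sample-path idea (a sketch of the generic Monte Carlo estimator, not necessarily the exact scheme of this paper): simulate one long trajectory and average −log2 P[x_k, x_{k+1}] along it; by the ergodic theorem this average converges to the entropy rate.

```python
import numpy as np

def sample_path_entropy_rate(P, n, seed=0):
    """Estimate the Shannon entropy rate (bits/symbol) of an ergodic
    finite-state Markov chain from a single simulated path of length n,
    by averaging -log2 of the transition probabilities actually taken."""
    rng = np.random.default_rng(seed)
    m = P.shape[0]
    x = 0                                  # arbitrary start state; ergodicity
    total = 0.0                            # washes out the initial condition
    for _ in range(n):
        y = rng.choice(m, p=P[x])          # sample the next state
        total += -np.log2(P[x, y])         # surprisal of the observed transition
        x = y
    return total / n
```

For the fair-coin chain (all transition probabilities 1/2) every step contributes exactly 1 bit, so the estimator is exact; for general chains the error shrinks with the path length, which is where the exponential convergence claimed in the abstract becomes relevant.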